Tropical secant graphs of monomial curves
The first secant variety of a projective monomial curve is a threefold with
an action by a one-dimensional torus. Its tropicalization is a
three-dimensional fan with a one-dimensional lineality space, so the tropical
threefold is represented by a balanced graph. Our main result is an explicit
construction of that graph. As a consequence, we obtain algorithms to
effectively compute the multidegree and Chow polytope of an arbitrary
projective monomial curve. This generalizes an earlier degree formula due to
Ranestad. The combinatorics underlying our construction is rather delicate, and
it is based on a refinement of the theory of geometric tropicalization due to
Hacking, Keel and Tevelev.
Comment: 30 pages, 8 figures. Major revision of the exposition. In particular,
old Sections 4 and 5 are merged into a single section. Also, added Figure 3
and discussed Chow polytopes of rational normal curves in Section
Toward a Robust Sparse Data Representation for Wireless Sensor Networks
Compressive sensing has been successfully used for optimized operations in
wireless sensor networks. However, raw data collected by sensors may be neither
originally sparse nor easily transformed into a sparse data representation.
This paper addresses the problem of transforming source data collected by
sensor nodes into a sparse representation with few nonzero elements. Our
contributions address three major issues: 1) an effective method that
extracts the population sparsity of the data, 2) a sparsity-ratio guarantee
scheme, and 3) a customized learning algorithm for the sparsifying dictionary.
We introduce an unsupervised neural network to extract an intrinsic sparse
coding of the data. The sparse codes are generated at the activation of the
hidden layer using a sparsity nomination constraint and a shrinking mechanism.
Our analysis using real data samples shows that the proposed method outperforms
conventional sparsity-inducing methods.
Comment: 8 pages
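The abstract above describes generating sparse codes at a hidden layer via a sparsity constraint and a shrinking mechanism. A minimal sketch of that idea is a linear hidden activation followed by soft-thresholding (a standard shrinking operator that zeroes small entries); the network shape, weights, and threshold below are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

def sparse_codes(x, W, b, threshold=0.1):
    """Hidden-layer activations with a shrinking (soft-threshold) step.

    The linear activation W @ x + b is passed through soft-thresholding,
    which zeroes entries with magnitude below `threshold`, yielding a
    sparse code. This is a generic stand-in for the paper's unsupervised
    network, not its exact formulation.
    """
    h = W @ x + b
    return np.sign(h) * np.maximum(np.abs(h) - threshold, 0.0)

rng = np.random.default_rng(0)
x = rng.normal(size=16)                     # one raw sensor-reading vector
W = rng.normal(scale=0.1, size=(32, 16))    # hypothetical learned weights
b = np.zeros(32)
z = sparse_codes(x, W, b, threshold=0.2)
ratio = np.mean(z == 0)                     # fraction of zero codes (sparsity)
```

Raising the threshold increases the zero fraction, which is how a sparsity-ratio target could be enforced in practice.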
Machine Learning in Wireless Sensor Networks: Algorithms, Strategies, and Applications
Wireless sensor networks monitor dynamic environments that change rapidly
over time. This dynamic behavior is either caused by external factors or
initiated by the system designers themselves. To adapt to such conditions,
sensor networks often adopt machine learning techniques to eliminate the need
for unnecessary redesign. Machine learning also inspires many practical
solutions that maximize resource utilization and prolong the lifespan of the
network. In this paper, we present an extensive literature review over the
period 2002-2013 of machine learning methods that were used to address common
issues in wireless sensor networks (WSNs). The advantages and disadvantages of
each proposed algorithm are evaluated against the corresponding problem. We
also provide a comparative guide to aid WSN designers in developing suitable
machine learning solutions for their specific application challenges.
Comment: Accepted for publication in IEEE Communications Surveys and Tutorials
Rate-distortion Balanced Data Compression for Wireless Sensor Networks
This paper presents a data compression algorithm with error bound guarantee
for wireless sensor networks (WSNs) using compressing neural networks. The
proposed algorithm minimizes data congestion and reduces energy consumption by
exploring spatio-temporal correlations among data samples. The adaptive
rate-distortion feature balances the compressed data size (data rate) with the
required error bound guarantee (distortion level). This compression relieves
the strain on energy and bandwidth resources while collecting WSN data within
tolerable error margins, thereby increasing the scale of WSNs. The algorithm is
evaluated using real-world datasets and compared with conventional methods for
temporal and spatial data compression. The experimental validation reveals that
the proposed algorithm outperforms several existing WSN data compression
methods in terms of compression efficiency and signal reconstruction. Moreover,
an energy analysis shows that compressing the data can reduce energy
expenditure and hence extend the service lifespan severalfold.
Comment: arXiv admin note: text overlap with arXiv:1408.294
New Characterizations and Efficient Local Search for General Integer Linear Programming
Integer linear programming (ILP) models a wide range of practical
combinatorial optimization problems and has a significant impact in industry
and management. This work proposes new characterizations of ILP based on the
concept of boundary solutions. Motivated by the new characterizations, we
develop an efficient local search solver, which is the first local search
solver for general ILP validated on a large heterogeneous problem dataset. We
propose a new local search framework that switches between three modes, namely
Search, Improve, and Restore modes. We design tailored operators adapted to
different modes, thus improving the quality of the current solution according
to different situations. For the Search and Restore modes, we propose an
operator named tight move, which adaptively modifies variables' values, trying
to make some constraint tight. For the Improve mode, an efficient operator
named lift move is proposed to improve the objective value while
maintaining feasibility. Putting these together, we develop a local search
solver for integer linear programming called Local-ILP. Experiments conducted
on the MIPLIB dataset show the effectiveness of our solver in solving
large-scale hard integer linear programming problems within a reasonably short
time. Local-ILP is competitive and complementary to the state-of-the-art
commercial solver Gurobi and significantly outperforms the state-of-the-art
non-commercial solver SCIP. Moreover, our solver establishes new records for 6
MIPLIB open instances. A theoretical analysis of our algorithm is also
presented, showing that it avoids visiting unnecessary regions and maintains
good connectivity among targeted solutions.
Comment: 36 pages, 2 figures, 7 tables